
Conversation

@robertlong

This PR changes the internals of the fetch implementation so that fetch_read_body uses the Channel API to stream HTTP responses, instead of waiting for the entire response to finish and then returning the final ArrayBuffer.

Two things to note:

  1. The Channel<&[u8]> type serializes as a number[] instead of an ArrayBuffer or Uint8Array. I'm not certain whether Channel supports serializing as ArrayBuffers; if it does, that could be a performance improvement.

  2. The Channel API implementation does not guarantee that all events are received by the time the invoke promise is resolved. This is unfortunate, but I worked around it by returning an empty array over the Channel which is used to close the ReadableStream. I think this design works alright, but maybe there's a cleaner way to do this.
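The workaround in point 2 can be sketched as follows. This is a simplified, hypothetical shape (not the plugin's actual code): chunks arrive through a channel callback, and an empty chunk acts as the sentinel that closes the ReadableStream. The Tauri Channel is mocked here (MockChannel is invented for the example) so the snippet is self-contained and runnable on Node 18+.

```javascript
// Mock of Tauri's Channel: the backend would normally assign onmessage.
class MockChannel {
  onmessage = () => {}
}

// Build a ReadableStream fed by channel messages. An empty array over
// the channel signals end-of-body, since the invoke promise resolving
// does not guarantee all channel events were received.
function bodyStream(channel) {
  return new ReadableStream({
    start(controller) {
      channel.onmessage = (bytes) => {
        if (bytes.length === 0) {
          // Sentinel: close the stream
          controller.close()
        } else {
          // The channel currently delivers number[], so convert
          controller.enqueue(new Uint8Array(bytes))
        }
      }
    },
  })
}

// Usage: simulate a backend sending two chunks and the close sentinel.
const channel = new MockChannel()
const reader = bodyStream(channel).getReader()

channel.onmessage([104, 105]) // "hi"
channel.onmessage([33])       // "!"
channel.onmessage([])         // sentinel: close the stream

async function drain() {
  const chunks = []
  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(...value)
  }
  return String.fromCharCode(...chunks)
}
```

The sentinel is needed only because event delivery is not ordered against promise resolution; if Channel ever guaranteed delivery before resolve, the stream could be closed from the invoke promise instead.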

Fixes #2129

@robertlong robertlong requested a review from a team as a code owner December 4, 2024 21:28

github-actions bot commented Dec 4, 2024

Package Changes Through 3f9f336

No changes.

Add a change file through the GitHub UI by following this link.


Read about change files or the docs at github.com/jbolda/covector

@amrbashir amrbashir linked an issue Dec 8, 2024 that may be closed by this pull request

@amrbashir amrbashir left a comment


Thank you, could you also add a change file in the .changes directory?

// If the promise fails, make sure the stream is closed
readPromise.catch((e) => {
  console.error('error reading body', e)
  controller.error(e)
})

do we really need to console.error here? won't controller.error be enough?
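The reviewer's point can be illustrated with a minimal, hypothetical sketch (not the PR's actual code): calling controller.error(e) alone is enough to surface the failure to whoever consumes the stream, so the extra console.error may be redundant.

```javascript
// A stream whose body-reading promise fails; instead of logging, the
// error is routed into the stream via controller.error.
const failing = new ReadableStream({
  start(controller) {
    const readPromise = Promise.reject(new Error('network dropped'))
    // Fail the stream; no console.error needed
    readPromise.catch((e) => controller.error(e))
  },
})

// The consumer sees the original error when reading from the stream.
async function consume() {
  try {
    await failing.getReader().read()
    return 'no error surfaced'
  } catch (e) {
    return e.message
  }
}
```

Because read() rejects with the same Error object, the caller can decide whether and how to log, which is usually preferable to logging inside library code.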


let res = Arc::into_inner(res).unwrap().0;
Ok(tauri::ipc::Response::new(res.bytes().await?.to_vec()))
let mut stream = res.bytes_stream();

@amrbashir amrbashir Dec 8, 2024


can't we do the same streaming behavior using the Response::chunk method, instead of pulling in another crate to iterate over the stream?

@FabianLars

positive results on discord: https://discord.com/channels/616186924390023171/1344666371316908063 (not rushing anyone, i still think the review comments need to be acknowledged)

@livefantasia

hello, is there any progress on this PR? Waiting for it to enable LLM streaming

@FabianLars

still waiting on reactions to the review comments 🤷



Development

Successfully merging this pull request may close these issues.

[feat] Progress reporting for HTTP API in JS
[bug] plugin-http is unable to perform streaming HTTP requests

4 participants